The Rise of Data-Driven Investing


When JCPenney reported its financial results for Q2 2015, it came as quite a shock to the market.

The US retail giant outperformed most analyst expectations, which led to a 10% surge in its stock price over the following two days. But not everyone was caught off guard…

RSMetrics, a Big Data intelligence firm for businesses and investors, used satellite imagery of JCPenney parking lots during the quarter to confirm that traffic into its stores across the country was in fact increasing.

The firm’s clients (mostly hedge funds) who paid to obtain this satellite imagery could thus deduce, virtually in real time, that JCPenney’s performance was on the up. And many of them ultimately capitalized on this information by buying JCPenney stock well before the release of the company’s Q2 report in August – and well before the 10% price jump.

The example illustrates just how useful – if not vital – data such as satellite imagery is now proving for exploiting investment opportunities well ahead of the rest of the market. Such data simply wasn’t being collected for investment purposes a few years ago; indeed, financial analysts using traditional financial models wouldn’t even have acknowledged its relevance.
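As a toy sketch of the technique – using invented car counts rather than RSMetrics’ actual data or methodology – year-over-year growth in observed parking-lot traffic might be turned into a simple pre-earnings signal like this:

```python
# Hypothetical weekly car counts from satellite images of a retailer's
# parking lots -- invented numbers for illustration only.
counts_q2_2015 = [1180, 1215, 1240, 1262, 1290, 1310]
counts_q2_2014 = [1150, 1145, 1160, 1152, 1148, 1155]

avg_now = sum(counts_q2_2015) / len(counts_q2_2015)
avg_prior = sum(counts_q2_2014) / len(counts_q2_2014)
yoy_growth = (avg_now - avg_prior) / avg_prior

# Naive decision rule (the threshold is an arbitrary assumption): go long
# ahead of earnings if observed store traffic is growing strongly.
signal = "BUY" if yoy_growth > 0.05 else "HOLD"
print(f"YoY traffic growth: {yoy_growth:.1%} -> {signal}")
```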

Thankfully, today we don’t have to settle for relying on theory alone. As well as financial modeling has served us over the decades, we now have something more granular, more attuned to the real world, and more prescient than any chart or news report.

Lots and lots of data

“All models are wrong, but some are useful” – George E.P. Box, British statistician.

Investment models have contributed greatly to the development and maturation of financial markets, and they have rightfully earned their creators accolades along the way. But truth be told, almost all models have at least one identifiable deficiency. Invariably, it is found in the assumptions they make – assumptions that don’t necessarily hold up in the real world.

Take the Capital Asset Pricing Model (CAPM), a ground-breaking model in its day for determining the required rate of return on an asset, given an already well-diversified portfolio. CAPM has proven to exhibit shortcomings when measured against empirical outcomes.

For example, it assumes investors can borrow and lend at a “risk-free rate,” a parameter often proxied by the yield on government debt. But real-world borrowing and lending rates are rarely on par with government borrowing rates. What’s more, government debt yields do not remain constant, as CAPM assumes. And finding a suitably accurate proxy for β_i, which measures systematic, undiversifiable risk, has consistently proven challenging.
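For concreteness, CAPM states E[R_i] = R_f + β_i(E[R_m] − R_f), where β_i = Cov(R_i, R_m) / Var(R_m). A minimal sketch of the standard estimation from historical returns – all return figures and the risk-free proxy below are invented:

```python
import numpy as np

# Invented monthly returns for one asset and the market index.
asset = np.array([0.020, -0.010, 0.030, 0.015, -0.005, 0.025])
market = np.array([0.015, -0.012, 0.022, 0.010, -0.002, 0.018])

# beta_i = Cov(R_i, R_m) / Var(R_m)
beta = np.cov(asset, market, ddof=1)[0, 1] / np.var(market, ddof=1)

# Risk-free rate proxied here by a government yield -- exactly the proxy
# the paragraph above criticizes as unrealistic.
risk_free = 0.002
expected_return = risk_free + beta * (market.mean() - risk_free)
print(f"beta = {beta:.2f}, CAPM expected return = {expected_return:.2%}")
```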

Carrying such shortcomings into the real world can prove disastrous. If banks, hedge funds, and investors rely too heavily on these equations, they are vulnerable to the consequences of those false assumptions and oversimplifications.

Professor Ian Stewart, whose book argues that the Black-Scholes equation contributed to the 2008 financial crash, lamented: “The equation is based on the idea that big movements are actually very, very rare. The problem is that real markets have these big changes much more often than this model predicts. And the other problem is that everyone’s following the same mathematical principles, so they’re all going to get the same answer.”

It’s worth acknowledging that Black-Scholes was devised in 1973, the same year the CBOE authorized call option trading on a mere 16 underlying stocks. As the options universe has become more diverse and complex since then, have valuation models like Black-Scholes adapted sufficiently to keep pace? I’d suggest not.
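For reference, the 1973 formula is compact enough to reproduce in a few lines. This sketch prices a European call under the model’s lognormal, constant-volatility assumptions – the very assumptions Stewart’s quote takes issue with; all parameters are illustrative:

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call on a non-dividend-paying stock."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf  # standard normal CDF -- the thin-tailed assumption
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Invented inputs: spot 100, strike 105, 6 months to expiry,
# 2% risk-free rate, 20% volatility.
print(f"Call price: {black_scholes_call(100, 105, 0.5, 0.02, 0.20):.2f}")
```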

And then, of course, there are the anomalies, irrationalities, and cognitive and emotional biases that render markets inefficient – none of which theoretical equations manage to capture.

All models have shortcomings. Taking a few steps back, it’s worth asking why we use them at all. Because models reflect our need to conceptualize the conditions and dynamics of economic forces and their outcomes.

But why bother conceptualizing at all? Because an educated guess has a market price – in the form of advisory services, textbooks, and the formulae you put inside the cells of a spreadsheet.

There are two ways to conceptualize economic interactions under uncertainty. One is through a set of assumptions; the other is through simulations, from which we can clearly isolate classifications and patterns.

For decades, we relied on the former approach because of the limited computational resources at our disposal. But as these resources become cheaper – along with data storage and an ever-greater variety of data – we can arrive at a better understanding of reality using the latter approach.
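To make the contrast concrete, here is a minimal sketch of the simulation approach: rather than assuming a closed-form distribution, generate many scenarios and read the statistics of interest straight off the simulated outcomes. The return-mixture parameters are invented purely to inject occasional large moves:

```python
import random

random.seed(42)

def daily_return():
    # 95% "calm" days, 5% "turbulent" days with far higher volatility --
    # a crude, hand-picked way to produce the fat tails real markets show.
    if random.random() < 0.95:
        return random.gauss(0.0003, 0.01)
    return random.gauss(0.0, 0.05)

paths, horizon = 10_000, 250  # 10,000 simulated years of daily returns
big_losses = 0
for _ in range(paths):
    wealth = 1.0
    for _ in range(horizon):
        wealth *= 1.0 + daily_return()
    if wealth < 0.8:  # lost more than 20% over the year
        big_losses += 1

print(f"P(annual loss > 20%) ~= {big_losses / paths:.1%}")
```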

The rise of data-driven investing

“Your kings of the universe are no longer the folks wearing suits and going to galas. It’s the folks that are crunching Python and going to meet-ups. These are becoming the new masters of the universe.” – Gene Ekster, CFA, originator of the term “alternative data”.

Data-driven investing builds on what models can achieve by enabling investors to extract significantly more granularity from their analyses. Through increasingly sophisticated techniques that capture huge quantities of data, we now have the means to reveal behavior, trends, and patterns of enormous relevance when gauging the appeal of a potential investment.

Applied to investing, such information is known as alternative data, and there’s seemingly no limit to the kinds that can be extracted. Whether it’s credit card data that verifies what consumers are purchasing, geolocation data that tracks cell phones, or data scraped from airline websites that signals whether or not to invest in the travel industry, these non-traditionally sourced datasets are facilitating much greater insight into potential investment targets.
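As a toy illustration of the credit card case – the transaction panel and column names below are invented, since data of this kind is in practice licensed from specialist vendors – quarterly spend can be aggregated into a crude revenue proxy ahead of the official report:

```python
import pandas as pd

# Invented, anonymized card transactions tagged to a single merchant.
transactions = pd.DataFrame({
    "merchant": ["JCPenney"] * 6,
    "quarter":  ["2015Q1", "2015Q1", "2015Q1", "2015Q2", "2015Q2", "2015Q2"],
    "amount":   [54.10, 23.99, 88.50, 61.25, 97.40, 35.80],
})

# Total observed spend per quarter, and its quarter-over-quarter growth,
# as a rough proxy for the revenue trend before earnings are released.
spend = transactions.groupby("quarter")["amount"].sum()
print(spend)
print(spend.pct_change())
```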

Data-driven vs. Model-driven

Simply put, the more relevant data we have to hand, the more informed our decision-making becomes. Rishi Narang’s Inside the Black Box identifies two distinct advantages of a data-driven approach:

  1. Data mining is much less widely practiced than theory-based investing, so there are far fewer competitors. It is also more technically complex, which creates higher barriers to entry into the field.
  2. Data-driven strategies can identify trends, patterns, and behaviors that don’t necessarily “fly under the flag” of an existing theory, which expands the universe of potential outcomes. Model-driven strategies, in contrast, are limited to capturing results that must be explained by existing equations and formulae, and thus generate a comparatively narrow set of conclusions.

Practitioners of data-driven investing are now identifying a range of new investment themes as a direct result of this improved access to big data. Takashi Suwabe, a portfolio manager of Quantitative Investment Strategies at Goldman Sachs Asset Management, observes three themes in particular: Momentum, Value, and Profitability.
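As a minimal sketch of the first of those themes, a naive momentum screen might rank tickers by trailing return, skipping the most recent month as is common practice – the tickers and prices below are invented:

```python
import pandas as pd

# Invented month-end prices for three hypothetical tickers.
prices = pd.DataFrame({
    "AAA": [50, 52, 55, 58, 60, 63, 65, 68, 70, 73, 75, 78, 80],
    "BBB": [40, 39, 41, 40, 42, 41, 40, 39, 41, 40, 39, 40, 41],
    "CCC": [30, 29, 28, 27, 26, 25, 24, 23, 22, 21, 20, 19, 18],
})

# Trailing return from the first observation to one month ago
# (the most recent month is excluded, a common momentum convention).
momentum = prices.iloc[-2] / prices.iloc[0] - 1
print(momentum.sort_values(ascending=False))  # long candidates at the top
```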

But it’s no use simply having more data at one’s disposal – the ability to harness this data and generate useful insights must similarly evolve…

Data and technology: a match made in heaven

Given such vast data proliferation – more than any one investor can reasonably handle – the power to process all this data becomes crucial. Thankfully, that processing power is increasing all the time.

Computing techniques now allow seemingly disparate, and often somewhat abstract, data to be collated, analyzed, and structured into easier-to-interpret formats. Data storage technology is also advancing, allowing highly scalable processes to be applied to enormous quantities of data.

According to data analytics platform Novus, three core structural fundamentals are needed to analyze data effectively (a rough sketch of how these layers might fit together follows the list):

  1. Data Foundation – an infrastructure that effortlessly collects, stores, aggregates, and normalizes all of your data sources automatically.
  2. Analytics Engine – an intelligent analytics engine that allows you to pull, combine, screen, and analyze all your data in various ways.
  3. Technology – the technology necessary to extract insight from your data and analytics by simply logging into a system.
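
As a rough illustration – not Novus’s actual stack – the three layers above might map onto code along these lines; all function names, columns, and thresholds here are invented:

```python
import pandas as pd

# 1. Data Foundation: collect and normalize heterogeneous sources
#    (here, two invented mini-datasets with inconsistent column casing).
def load_and_normalize(sources):
    frames = [df.rename(columns=str.lower).assign(source=name)
              for name, df in sources.items()]
    return pd.concat(frames, ignore_index=True)

# 2. Analytics Engine: pull, combine, and screen the normalized data.
def screen(data, metric, threshold):
    return data[data[metric] > threshold]

# 3. Technology: a single entry point that turns raw sources into insight.
sources = {
    "satellite": pd.DataFrame({"Ticker": ["JCP"], "Traffic_Growth": [0.07]}),
    "web":       pd.DataFrame({"Ticker": ["JCP"], "Traffic_Growth": [0.04]}),
}
normalized = load_and_normalize(sources)
print(screen(normalized, "traffic_growth", 0.05))
```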

But data-driven methodology remains far from perfect…

The notion of “garbage in, garbage out” remains a pertinent one. There’s no point in having lots of data to hand just for the sake of it. Investors still have to decide whether it is ultimately relevant and whether useful insights can be gleaned from it. If data that has no connection to what we are attempting to predict is fed into the system, spurious results will follow.

Furthermore, evidence suggests that not everyone is correctly utilizing the massive amounts of data available. As former Hewlett-Packard CEO Carly Fiorina once succinctly observed, “The goal is to turn data into information, and information into insight.” A lot of data may end up being ultimately meaningless, while even valuable data will require laborious treatment to ensure it is free of errors.

It is also worth verifying whether computers are processing data quickly enough to ensure it remains meaningful. According to Narang, data mining strategies require “nearly constant adjustment to keep up with the changes going on in markets, an activity that has many risks in itself”. As hardware continues to evolve, however, one would expect this concern to be more ably managed.

In the search for yield, data is proving to offer investors a competitive edge, transforming the way decisions are reached. In response, asset managers – including the industry’s biggest hitters, such as Winton and Two Sigma – are redesigning their operating models, hiring data specialists by the bucketload, and overhauling their data-handling infrastructure. It seems only a matter of time before the entire cross-section of investors wakes up to the enormous potential of data for helping them make smarter investment decisions.

But is data-driven investing rendering financial models obsolete? Not quite. At this stage, it would seem the two are augmenting each other to provide more robust analysis. As observed by Third Point’s founder Dan Loeb, “When we add in the use of data sets and ‘quantamental’ techniques [a hybrid strategy combining fundamental analysis with quantitative investment models] that are increasingly important to remain competitive while investing in single-name equities, it is clear that our business is rapidly evolving.”

Dr Justin Chan

Dr Chan founded DataDrivenInvestor.com (DDI) and is the CEO of JCube Capital Partners. Specializing in strategy development, alternative data analytics, and behavioral finance, he also has extensive experience in the investment management and financial services industries. Prior to forming JCube and DDI, Dr Chan worked on strategy development at multiple hedge funds and fintech companies, and served as a senior quantitative strategist at GMO. A published author in professional finance journals, he holds a Ph.D. in finance from UCLA.
